Install the Hive package
- Upload apache-hive-0.13.1-bin.tar.gz to the /usr/local directory on spark1 using an SFTP tool.
Extract the Hive archive:
tar -zxvf apache-hive-0.13.1-bin.tar.gz
Rename the Hive directory:
mv apache-hive-0.13.1-bin hive
Configure the Hive environment variables
vi ~/.bashrc
export HIVE_HOME=/usr/local/hive
export PATH=$PATH:$HIVE_HOME/bin
source ~/.bashrc
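To confirm the variables took effect, a quick sanity check such as the following can help (the paths assume the layout above; adjust if Hive was unpacked elsewhere):

```shell
# Reload the shell config and confirm Hive is on the PATH
source ~/.bashrc
echo $HIVE_HOME    # should print /usr/local/hive
which hive         # should resolve to /usr/local/hive/bin/hive
```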
Install MySQL
- Install MySQL on spark1.
Install mysql-server with yum:
yum install -y mysql-server
service mysqld start
chkconfig mysqld on
Install the MySQL connector with yum:
yum install -y mysql-connector-java
Copy the MySQL connector jar into Hive's lib directory:
cp /usr/share/java/mysql-connector-java-5.1.17.jar /usr/local/hive/lib
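It is worth verifying the copy landed, since Hive fails at metastore startup with a cryptic ClassNotFoundException if the driver jar is missing (the version number matches the one copied above):

```shell
# Confirm the JDBC driver is now visible to Hive
ls /usr/local/hive/lib/mysql-connector-java-5.1.17.jar
```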
Create the Hive metastore database in MySQL and grant the hive user access:
create database if not exists hive_metadata;
grant all privileges on hive_metadata.* to 'hive'@'%' identified by 'hive';
grant all privileges on hive_metadata.* to 'hive'@'localhost' identified by 'hive';
grant all privileges on hive_metadata.* to 'hive'@'spark1' identified by 'hive';
flush privileges;
use hive_metadata;
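Before moving on, it helps to confirm the hive account can actually log in with the grants above; a minimal check, using the hostname and password configured above:

```shell
# Log in as the hive user and list its databases;
# hive_metadata should appear if the grants succeeded
mysql -h spark1 -u hive -phive -e "show databases;"
```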
Configure hive-site.xml
In /usr/local/hive/conf, rename the template:
mv hive-default.xml.template hive-site.xml
Change the following properties:
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://spark1:3306/hive_metadata?createDatabaseIfNotExist=true</value>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hive</value>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>hive</value>
</property>
<property>
<name>hive.metastore.warehouse.dir</name>
<value>/user/hive/warehouse</value>
</property>
Configure hive-env.sh and hive-config.sh
mv hive-env.sh.template hive-env.sh
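The section title also mentions hive-config.sh (found under /usr/local/hive/bin); a common step is to export the Java, Hadoop, and Hive homes at the end of that file. The JAVA_HOME and HADOOP_HOME paths below are assumptions for a typical layout; adjust them to the actual install on spark1:

```shell
# Appended to /usr/local/hive/bin/hive-config.sh
# (JAVA_HOME and HADOOP_HOME are assumed paths; adjust to your install)
export JAVA_HOME=/usr/java/latest
export HADOOP_HOME=/usr/local/hadoop
export HIVE_HOME=/usr/local/hive
```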
Verify that Hive is installed correctly
Typing the hive command should drop you into the Hive CLI.
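Beyond just entering the CLI, a short smoke test also exercises the MySQL metastore configured above; the table name here is a hypothetical throwaway:

```shell
# Create and drop a throwaway table; success implies the metastore
# connection in hive-site.xml is working
hive -e "create table if not exists smoke_test (id int); show tables; drop table smoke_test;"
```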